Functional Frank-Wolfe Boosting for General Loss Functions

Authors

  • Chu Wang
  • Yingfei Wang
  • Weinan E
  • Robert E. Schapire
Abstract

Boosting is a generic learning method for classification and regression. Yet, as the number of base hypotheses grows, boosting can suffer a deterioration in test performance. Overfitting is an important and ubiquitous phenomenon, especially in regression settings. To avoid overfitting, we consider using l1 regularization. We propose a novel Frank-Wolfe-type boosting algorithm (FWBoost) that applies to general loss functions. With the exponential loss, the FWBoost algorithm can be rewritten as a variant of AdaBoost for binary classification. FWBoost algorithms have exactly the same form as existing boosting methods in terms of making calls to a base learning algorithm, differing only in the weight updates. This direct connection between boosting and Frank-Wolfe yields a new algorithm that is as practical as existing boosting methods but comes with new guarantees and rates of convergence. Experimental results show that the test performance of FWBoost does not degrade with more boosting rounds, consistent with the theoretical analysis.
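As a concrete illustration of the template described in the abstract (repeated calls to a base learner, with Frank-Wolfe updates over an l1-constrained ensemble), below is a minimal sketch for squared loss. The radius tau, the step size 2/(t+2), and the decision-stump base learner are illustrative assumptions, not the paper's exact specification.

    # Minimal sketch of a Frank-Wolfe boosting loop for squared loss.
    # tau (l1 radius), the 2/(t+2) step size, and the stump base learner
    # are illustrative choices, not the authors' exact algorithm.
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    def fwboost_fit(X, y, tau=10.0, n_rounds=100):
        """Minimize (1/2n) * sum_i (f(x_i) - y_i)^2 over the tau-scaled
        l1-ball of base-hypothesis combinations via Frank-Wolfe steps."""
        n = len(y)
        preds = np.zeros(n)          # current ensemble predictions f_t(x_i)
        ensemble = []                # list of (coefficient, base hypothesis)
        for t in range(n_rounds):
            grad = (preds - y) / n   # functional gradient of squared loss
            # Linear minimization oracle: the base learner approximates the
            # hypothesis most correlated with the negative gradient.
            h = DecisionTreeRegressor(max_depth=1).fit(X, -grad)
            h_preds = h.predict(X)
            # The l1-ball vertex is +/- tau * h; pick the descending sign.
            s = -np.sign(np.dot(grad, h_preds)) * tau
            gamma = 2.0 / (t + 2.0)  # classical Frank-Wolfe step size
            preds = (1 - gamma) * preds + gamma * s * h_preds
            ensemble = [(c * (1 - gamma), b) for c, b in ensemble]
            ensemble.append((gamma * s, h))
        return ensemble

Per the abstract, instantiating the same template with the exponential loss instead of squared loss is what recovers an AdaBoost-like variant for binary classification.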


Similar Resources

Fast Rates for Bandit Optimization with Upper-Confidence Frank-Wolfe

We consider the problem of bandit optimization, inspired by stochastic optimization and online learning problems with bandit feedback. In this problem, the objective is to minimize a global loss function of all the actions, not necessarily a cumulative loss. This framework allows us to study a very general class of problems, with applications in statistics, machine learning, and other fields. T...


Bandit Optimization with Upper-Confidence Frank-Wolfe

We consider the problem of bandit optimization, inspired by stochastic optimization and online learning problems with bandit feedback. In this problem, the objective is to minimize a global loss function of all the actions, not necessarily a cumulative loss. This framework allows us to study a very general class of problems, with applications in statistics, machine learning, and other fields. T...
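The abstract above is cut off before the algorithm itself; purely as a hedged sketch of the idea suggested by the title (a Frank-Wolfe step whose gradient is replaced by an optimistic estimate built from bandit feedback), not the paper's exact method:

    # Schematic upper-confidence Frank-Wolfe step over the probability simplex.
    # The gradient estimate and the confidence bonus below are hypothetical
    # placeholders; consult the paper for the actual estimator and bonus.
    import numpy as np

    def ucb_frank_wolfe_step(t, counts, grad_est):
        """Pick the next action as the simplex vertex (a single arm) that
        minimizes an optimistic, confidence-lowered gradient estimate."""
        bonus = np.sqrt(2.0 * np.log(t + 1) / np.maximum(counts, 1))
        return int(np.argmin(grad_est - bonus))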


A General Boosting Method and its Application to Learning Ranking Functions for Web Search

We present a general boosting method extending functional gradient boosting to optimize complex loss functions that are encountered in many machine learning problems. Our approach is based on optimization of quadratic upper bounds of the loss functions which allows us to present a rigorous convergence analysis of the algorithm. More importantly, this general framework enables us to use a standa...
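To make the quadratic upper-bound device concrete, bounds of the following standard smoothness form (not necessarily the paper's exact construction) are what such methods minimize round by round: if the loss L has an M-Lipschitz functional gradient, then

\[ L(f + h) \;\le\; L(f) + \langle \nabla L(f),\, h \rangle + \frac{M}{2}\,\|h\|^2, \]

so each boosting round can fit the base hypothesis h by minimizing the right-hand side, a weighted least-squares problem that standard base learners handle.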


On the Extensions of Frank-Wolfe Theorem

In this paper we consider optimization problems defined by a quadratic objective function and a finite number of quadratic inequality constraints. Given that the objective function is bounded over the feasible set, we present a comprehensive study of the conditions under which the optimal solution set is nonempty, thus extending the so-called Frank-Wolfe theorem. In particular, we first prove a gene...
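For context, the classical result being extended states:

\[ \text{If } q(x) = \tfrac{1}{2}\,x^\top Q x + c^\top x \text{ is bounded below on a nonempty polyhedron } \{x : Ax \le b\}, \text{ then } q \text{ attains its minimum on it.} \]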


Greedy Optimization and Applications to Structured Tensor Factorizations

Efficiently representing real-world data in a succinct and parsimonious manner is of central importance in many fields. We present a generalized greedy framework, which allows us to efficiently solve generalized structured matrix and tensor factorization problems, where the factors are allowed to be from arbitrary sets of structured vectors. Such structure may include sparsity, nonnegativity, ...
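As a rough sketch of one step of such a generalized greedy scheme for rank-one matrix atoms (the structured oracle best_atom is a hypothetical placeholder for the paper's constrained subproblem):

    # Minimal sketch of a generalized greedy (matching-pursuit-style) loop
    # for structured rank-one matrix factorization. `best_atom` stands in
    # for a structured oracle (e.g. returning sparse or nonnegative unit vectors).
    import numpy as np

    def greedy_factorize(Y, best_atom, n_atoms=10):
        """Approximate Y by a sum of structured rank-one atoms c * u v^T."""
        residual = Y.copy()
        atoms = []
        for _ in range(n_atoms):
            u, v = best_atom(residual)         # unit vectors from the oracle
            coef = u @ residual @ v            # optimal weight for this atom
            residual -= coef * np.outer(u, v)  # greedy residual update
            atoms.append((coef, u, v))
        return atoms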



Journal:
  • CoRR

Volume: abs/1510.02558
Publication date: 2015